The theory based on the camera analogy, however, leaves everything about perception still to be explained. This becomes evident as soon as we compare the way the world appears to us with the images of it projected on our retinas. I have already noted the discrepancy between the shape of the object's image and its perceived shape. Beyond that, the image of an object projected on the retina changes continuously as we move and as our position with respect to the object changes, yet we tend to perceive the properties of objects as constant under most viewing conditions. In watching people walking toward or away from us, for instance, we don't see them physically enlarging or shrinking, even though the image on the retina does exactly that. If we tilt our heads to one side while gazing at a building, it doesn't appear to tilt, even though its retinal image does so. As we go from dim, indoor illumination to bright sunlight, the intensity of light reaching the eye can vary by a factor of thousands. Nevertheless, white surfaces look white even in dim light, and black ones look black even in intense light. How can we explain these constancies in our perception? The camera theory would lead us to predict continuous change.
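To make the size example concrete, here is a minimal sketch of the geometry involved; the symbols $h$, $d$, $\theta$, $E$, $R$, and $L$ are introduced only for illustration and are not part of the original discussion. An object of height $h$ viewed from distance $d$ subtends a visual angle of roughly
\[
\theta = 2\arctan\!\left(\frac{h}{2d}\right) \approx \frac{h}{d} \quad \text{(small-angle approximation)},
\]
so when an approaching person halves the viewing distance, the height of the retinal image roughly doubles. Likewise, the light a surface sends to the eye is approximately the product of the illumination falling on it and the fraction of that light it reflects,
\[
L \approx E \cdot R,
\]
so when the illumination $E$ varies by a factor of thousands, so does $L$; a black surface in sunlight can actually send more light to the eye than a white surface indoors. These are the continuous changes the camera theory would lead us to expect to perceive.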